More dwsLinq improvements: query sources and JSON
When I started writing dwsLinq, I never intended it to be just an integrated SQL query builder. I did that first because it was both useful and easy to implement, but it was never the end goal.
Over the last few days I’ve taken a big step towards the real goal: a general-purpose data querying system. I factored all of the SQL-specific code out into its own unit and created an interface called IQueryBuilder that responds to the various query operators by building an expression tree that DWS can evaluate. You write a recognizer and register it with dwsLinq; the recognizer checks the value of your FROM expression and returns an IQueryBuilder if it can create an expression for it. Once I had that working, I created a second unit with a new implementation of IQueryBuilder, this one for querying JSON data.
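To make the shape of this concrete, here is a rough sketch of what the recognizer arrangement might look like. The names and signatures below are illustrative only, not the actual dwsLinq declarations, which are subject to change:

```pascal
type
  // Hypothetical sketch of the builder interface: each method corresponds
  // to a query operator and extends the expression tree that DWS evaluates.
  IQueryBuilder = interface
    procedure AddFilter(filter: TTypedExpr);           // WHERE
    procedure AddOrder(field: TTypedExpr; asc: boolean); // ORDER BY
    function Finalize: TTypedExpr;                     // the finished tree
  end;

  // A recognizer inspects the FROM expression and returns a builder
  // if it knows how to query that kind of value, or nil otherwise.
  TQueryRecognizer = function(fromExpr: TTypedExpr): IQueryBuilder;

// Registration at startup (hypothetical call):
//   dwsLinqFactory.RegisterSource(JsonQueryRecognizer);
```

The key design point is that dwsLinq itself stays data-source-agnostic: it parses the query syntax and delegates all semantics to whichever IQueryBuilder claimed the FROM expression.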
I checked it in recently, and it supports all of the query operators except JOIN (which raises a compiler error, since I couldn’t figure out how to do it in any meaningful way without language support for anonymous methods) and DISTINCT (which is a no-op that raises a compiler warning). It takes a JSONVariant value (supplied by the JSON Connector language extension) as input and produces a new one as output. The INTO operator expects a JSONVariant as input and can output any type. (There’s no special syntax to distinguish array types from single elements, as there is in the SQL version, because that’s a lot more complicated to pull off without an enforced schema. I might change that later on.)
The JSON query system is still very rough, and very much a work in progress. It’s a lot harder to get things right than with SQL, because SQL relations have a schema and JSON does not. (It’s essentially the same as the difference between ensuring type safety in a static language and in a dynamic one: getting it right in the dynamic language is a lot more work!)
The query system expects the input value to consist of an array of objects (much like a SQL result set) and should work more or less as expected under those conditions. I’ve tried to add cases to the various query operators to handle that shape, but it still needs work, and I’ll keep refining it.
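As an example of what this enables, a query over a JSONVariant array-of-objects might look something like the following. This is an illustrative sketch, and the exact dwsLinq grammar and JSON Connector function names may differ:

```pascal
// Parse a JSON array of objects; each element plays the role of a row.
// (ParseJSON is a stand-in for whatever the JSON Connector actually exposes.)
var data := JSON.ParseJSON(
  '[{"name":"Alice","age":34},{"name":"Bob","age":28}]');

// Query it much like a SQL result set; the result is a new JSONVariant.
var adults := from data
   where age >= 30
   order by name
   select name, age;
```

Because there is no schema, nothing guarantees at compile time that every element actually has a `name` or `age` property; that is exactly the kind of case the operator implementations still have to handle gracefully.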
But what this means is that anyone could write an IQueryBuilder implementation and their own query expression objects, to be able to run LINQ expressions against arbitrary data. This is starting to become a really interesting project…
I suspect that optionally integrating the SQLite3 engine could be a great idea.
With the SQLite3 virtual table mechanism, you can execute a SQL statement over any kind of data, even plain DWS or Delphi objects.
And it can run JOINed queries across any kind of data.
We use this virtual table mechanism intensively in our little mORMot, and it works great.
There are already pure Delphi classes to abstract this virtual table mechanism, including very efficient JSON marshaling.
We use virtual tables to access external database tables or a good old TObjectList.
Performance is great – see http://blog.synopse.info/post/2012/07/25/Synopse-mORMot-benchmark and the newer http://blog.synopse.info/post/2013/02/12/Introducing-ZEOS%2C-UniDAC%2C-NexusDB%2C-BDE%2C-any-TDataset-to-SynDB-and-mORMot-s-ORM – those posts show that the SQLite3 virtual table mechanism is efficient, even though mORMot also allows direct access when possible.
… in fact, on second thought, it could be a good idea to use mORMot’s ORM and Client-Server abilities within DWS.
Since the mORMot DB layer is already optionally integrated, it may be worth integrating other parts of the framework.
Your dwsLinq could make integrating mORMot easy, via the IQueryBuilder interface.
Sure! It’s all checked in to the DWS repository. Feel free to write an IQueryBuilder implementation for mORMot. Just be aware that the whole system is still under development, and the definition of IQueryBuilder is subject to change without notice.
Yeah, Eric mentioned your work with virtual tables to me when I was first discussing the possibility of a LINQ implementation with him. But I don’t think they would be helpful in this particular case. The problem is that SQL tables require a schema, and JSON does not. Without some way to enforce a predictable structure on JSON data, your ability to work with it using tools that require a predictable structure is very limited.